On the Problem of Grounding a Relational Probabilistic Conditional Knowledge Base

Authors

  • Sebastian Loh
  • Matthias Thimm
  • Gabriele Kern-Isberner
Abstract

For first-order probabilistic knowledge representation, grounding is an important means to define a semantics for knowledge bases that extends the propositional semantics. However, naive approaches to grounding may give rise to conflicts and inconsistencies, particularly when the formalism uses point probabilities rather than approximate or interval-based probabilities. In this paper, we formulate properties that can guide the search for suitable grounding operators. Moreover, we present three operators, the most sophisticated of which implements a stratified use of a specificity relation so that more specific information on objects is given priority over less specific information.
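The idea of a specificity-prioritized grounding operator can be illustrated with a minimal sketch. This is not the paper's actual operator but a simplified illustration under assumed representations: conditionals (B | A)[p] are triples of atoms and a point probability, specificity is approximated by the number of constants a conditional mentions, and more specific ground instances override less specific ones.

```python
from itertools import product

# A conditional (B | A)[p] reads: "if A holds, then B with probability p".
# Convention assumed here: uppercase terms are variables, lowercase are constants.

def is_var(term):
    return term[:1].isupper()

def substitute(atom, sigma):
    pred, args = atom
    return (pred, tuple(sigma.get(a, a) for a in args))

def ground(conditional, constants):
    """Naively ground a conditional over all constant substitutions."""
    head, body, p = conditional
    vars_ = sorted({a for atom in (head, body) for a in atom[1] if is_var(a)})
    for combo in product(constants, repeat=len(vars_)):
        sigma = dict(zip(vars_, combo))
        yield (substitute(head, sigma), substitute(body, sigma), p)

def ground_with_specificity(kb, constants):
    """Ground a knowledge base, letting more specific conditionals
    (those mentioning more constants) override less specific ones
    on shared ground instances."""
    def specificity(c):
        head, body, _ = c
        return sum(1 for atom in (head, body) for a in atom[1] if not is_var(a))
    result = {}
    for c in sorted(kb, key=specificity):   # least specific first
        for gh, gb, p in ground(c, constants):
            result[(gh, gb)] = p            # more specific overwrites
    return [(h, b, p) for (h, b), p in result.items()]

# Generic rule: birds fly with 0.9; specific exception for tweety: 0.1.
kb = [
    (("flies", ("X",)), ("bird", ("X",)), 0.9),
    (("flies", ("tweety",)), ("bird", ("tweety",)), 0.1),
]
grounded = ground_with_specificity(kb, ["tweety", "huey"])
```

Naive grounding alone would assign both 0.9 and 0.1 to (flies(tweety) | bird(tweety)), the kind of conflict point probabilities make unavoidable; the specificity ordering resolves it in favor of the exceptional rule while huey keeps the generic 0.9.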


Related works

Relational Probabilistic Conditionals and Their Instantiations under Maximum Entropy Semantics for First-Order Knowledge Bases

For conditional probabilistic knowledge bases with conditionals based on propositional logic, the principle of maximum entropy (ME) is well-established, determining a unique model inductively completing the explicitly given knowledge. On the other hand, there is no general agreement on how to extend the ME principle to relational conditionals containing free variables. In this paper, we focus o...


Generation of Parametrically Uniform Knowledge Bases in a Relational Probabilistic Logic with Maximum Entropy Semantics

In a relational setting, the maximum entropy model of a set of probabilistic conditionals can be defined referring to the full set of ground instances of the conditionals. The logic FO-PCL uses the notion of parametric uniformity to ensure that the full grounding of the conditionals can be avoided, thereby greatly simplifying the maximum entropy model computation. In this paper, we describe a s...


Efficient Inference and Learning in a Large Knowledge Base: Reasoning with Extracted Information using a Locally Groundable First-Order Probabilistic Logic

One important challenge for probabilistic logics is reasoning with very large knowledge bases (KBs) of imperfect information, such as those produced by modern web-scale information extraction systems. One scalability problem shared by many probabilistic logics is that answering queries involves "grounding" the query—i.e., mapping it to a propositional representation—and the size of a "grounding"...


Thesis at the Fakultät für Informatik, Universität Dortmund (Matthias Thimm)

Reasoning with inaccurate information is a major topic within the fields of artificial intelligence in general and knowledge representation and reasoning in particular. This thesis deals with information that can be incomplete, uncertain, and contradictory. We employ probabilistic conditional logic as a foundation for our investigation. This framework allows for the representation of uncertain ...


A Software System for the Computation, Visualization, and Comparison of Conditional Structures for Relational Probabilistic Knowledge Bases

Combining logic with probabilities is a core idea to uncertain reasoning. Recently, approaches to probabilistic conditional logics based on first-order languages have been proposed that employ the principle of maximum entropy (ME), e.g. the logic FO-PCL. In order to simplify the ME model computation, FO-PCL knowledge bases can be transformed so that they become parametrically uniform. On the ot...



Journal:

Volume   Issue

Pages  -

Publication date: 2010